Robustness Properties of Dimensionality Reduction with Gaussian Random Matrices

Authors

  • Bin Han
  • Zhiqiang Xu
Abstract

In this paper we study the robustness properties of dimensionality reduction with Gaussian random matrices having arbitrarily erased rows. We first study the robustness property against erasure for the almost norm preservation property of Gaussian random matrices by obtaining the optimal estimate of the erasure ratio for a small given norm distortion rate. As a consequence, we establish the robustness property of the Johnson-Lindenstrauss lemma and the robustness property of the restricted isometry property with corruption for Gaussian random matrices. Second, we obtain a sharp estimate for the optimal lower and upper bounds of norm distortion rates of Gaussian random matrices under a given erasure ratio. This allows us to establish the strong restricted isometry property with almost optimal RIP constants, which plays a central role in the study of phaseless compressed sensing.
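The phenomenon the abstract describes can be illustrated numerically: a properly scaled Gaussian random matrix nearly preserves the norm of a fixed vector, and this near-isometry survives the erasure of a small fraction of its rows once the surviving rows are rescaled. The sketch below is illustrative only (the dimensions, the 10% erasure ratio, and the rescaling convention are assumptions for the demo, not the paper's constructions):

```python
import numpy as np

rng = np.random.default_rng(0)

n, m = 1000, 300                      # ambient dimension, projected dimension
x = rng.standard_normal(n)
x /= np.linalg.norm(x)                # unit vector, so ||x|| = 1

# Gaussian random matrix scaled so that E||Ax||^2 = ||x||^2
A = rng.standard_normal((m, n)) / np.sqrt(m)

# Norm after projection: concentrates around 1 for large m
norm_full = np.linalg.norm(A @ x)

# Erase a fraction of rows (modeling corrupted or lost measurements),
# then rescale the surviving rows to restore the expected norm
erasure_ratio = 0.1
keep = rng.choice(m, size=int(m * (1 - erasure_ratio)), replace=False)
A_erased = A[keep] * np.sqrt(m / len(keep))

norm_erased = np.linalg.norm(A_erased @ x)

print(f"norm after projection: {norm_full:.3f}")
print(f"norm after 10% row erasure: {norm_erased:.3f}")
```

Both printed norms land close to 1 with high probability; the paper's contribution is to quantify exactly how the achievable distortion rate trades off against the erasure ratio.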


Similar resources

Isometric sketching of any set via the Restricted Isometry Property

In this paper we show that, for the purposes of dimensionality reduction, a certain class of structured random matrices behaves similarly to random Gaussian matrices. This class includes several matrices for which the matrix-vector multiply can be computed in log-linear time, providing efficient dimensionality reduction of general sets. In particular, we show that using such matrices any set from high d...


Small Sample Size in High Dimensional Space - Minimum Distance Based Classification

In this paper we present some new results concerning classification in the small-sample, high-dimensional case. We discuss geometric properties of data structures in high dimensions. It is known that such data form an almost regular simplex in high dimensions even if the covariance structure of the data is not the identity. We restrict our attention to two-class discrimination problems. It is assumed that ob...


Gaussian Process Latent Random Field

The Gaussian process latent variable model (GPLVM) is an unsupervised probabilistic model for nonlinear dimensionality reduction. A supervised extension, called discriminative GPLVM (DGPLVM), incorporates supervisory information into GPLVM to enhance the classification performance. However, its limitation of the latent space dimensionality to at most C − 1 (C is the number of classes) leads to ...


Joint low-rank approximation for extracting non-Gaussian subspaces

In this article, we consider high-dimensional data which contain a low-dimensional non-Gaussian structure contaminated with Gaussian noise. Motivated by joint diagonalization algorithms, we propose a linear dimension reduction procedure called joint low-dimensional approximation (JLA) to identify the non-Gaussian subspace. The method uses matrices whose non-zero eigenspaces coincide with ...


The Unreasonable Effectiveness of Random Orthogonal Embeddings

We present a general class of embeddings based on structured random matrices with orthogonal rows which can be applied in many machine learning applications including dimensionality reduction, kernel approximation and locality-sensitive hashing. We show that this class yields improvements over previous state-of-the-art methods either in computational efficiency (while providing similar accuracy)...



Journal:
  • CoRR

Volume: abs/1501.01695  Issue:

Pages: -

Publication date: 2015